Comparative Studies of Support Vector Regression between Reproducing Kernel and Gaussian Kernel
Authors
Abstract
Support vector regression (SVR) is regarded as a state-of-the-art method for function approximation and regression. The importance of the kernel function, the so-called admissible support vector kernel (SV kernel) in SVR, has motivated many studies of its composition. The Gaussian kernel (RBF) is regarded as the "best" choice of SV kernel for non-experts in SVR, although there is no evidence, beyond its superior performance on some practical applications, to support that view. It is well known that the reproducing kernel (R.K) is also an SV kernel and possesses many important properties, e.g. positive definiteness, the reproducing property, and the ability to compose complex R.Ks from simpler ones. However, only a limited number of R.Ks have explicit forms, and consequently few quantitative comparison studies exist in practice. In this paper, two R.Ks, i.e. SV kernels, composed from the sum and product of a translation-invariant kernel in a Sobolev space are proposed. An exploratory study of the performance of SVR based on a general R.K is presented through a systematic comparison with RBF using multiple criteria and synthetic problems. The results show that the R.K is an equivalent or even better SV kernel than RBF for problems with more input variables (more than 5, especially more than 10) and higher nonlinearity.
Keywords: admissible support vector kernel, reproducing kernel, reproducing kernel Hilbert space, support vector regression.
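As a minimal sketch of the kind of comparison the abstract describes, the snippet below fits scikit-learn's SVR once with the Gaussian (RBF) kernel and once with a custom callable kernel built as a coordinate-wise product of one-dimensional first-order Sobolev reproducing kernels, K(s, t) = 1 + min(s, t), a standard explicit R.K used here purely for illustration. The synthetic data and this particular R.K are assumptions, not the paper's exact construction.

```python
import numpy as np
from sklearn.svm import SVR

# Synthetic 5-input regression problem (illustrative, not from the paper).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 5))
y = np.sin(np.pi * X).sum(axis=1) + 0.05 * rng.normal(size=200)

def sobolev_rk_1d(s, t):
    # First-order Sobolev-space reproducing kernel on [0, 1]:
    # K(s, t) = 1 + min(s, t). A standard explicit R.K with a closed form.
    return 1.0 + np.minimum(s, t)

def product_rk(A, B):
    # Multivariate R.K composed as the product of 1-D kernels per coordinate,
    # using the product-composition property of reproducing kernels.
    # Inputs are rescaled from [-1, 1] to [0, 1] so the 1-D kernel applies.
    A01, B01 = (A + 1.0) / 2.0, (B + 1.0) / 2.0
    K = np.ones((A.shape[0], B.shape[0]))
    for d in range(A.shape[1]):
        K *= sobolev_rk_1d(A01[:, d][:, None], B01[:, d][None, :])
    return K

# scikit-learn's SVR accepts any callable returning the Gram matrix.
svr_rbf = SVR(kernel="rbf", C=10.0).fit(X, y)
svr_rk = SVR(kernel=product_rk, C=10.0).fit(X, y)
print("RBF training R^2:", svr_rbf.score(X, y))
print("R.K training R^2:", svr_rk.score(X, y))
```

Passing a callable as `kernel` is the simplest way to plug an explicit R.K into an off-the-shelf SVR; a systematic comparison like the paper's would of course use held-out test sets and multiple criteria rather than training scores.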
Similar Articles
A New Composition Method of Admissible Support Vector Kernel Based on Reproducing Kernel
The kernel function, which allows the formulation of nonlinear variants of any algorithm that can be cast in terms of dot products, has enabled Support Vector Machines (SVM) to be applied successfully in many fields, e.g. classification and regression. The importance of the kernel has motivated many studies of its composition. It is well known that the reproducing kernel (R.K) is a useful kernel function ...
Some Properties of Reproducing Kernel Banach and Hilbert Spaces
This paper is devoted to the study of reproducing kernel Hilbert spaces. We focus on multipliers of reproducing kernel Banach and Hilbert spaces. In particular, we try to extend this concept and prove some related theorems. Moreover, we focus on reproducing kernels in vector-valued reproducing kernel Hilbert spaces. In particular, we extend reproducing kernels to relative reproducing kernels an...
Predicting the Young's Modulus and Uniaxial Compressive Strength of a typical limestone using the Principal Component Regression and Particle Swarm Optimization
In geotechnical engineering, rock mechanics and engineering geology, depending on the project design, the uniaxial strength and static Young's modulus of rocks are of vital importance. Direct determination of these parameters in the laboratory, however, requires intact, high-quality cores, and the preparation of their specimens has some limitations. Moreover, performing thes...
Relationships between Gaussian processes, Support Vector machines and Smoothing Splines
Bayesian Gaussian processes and Support Vector machines are powerful kernel-based methods for attacking the pattern recognition problem. Probably owing to the very different philosophies of the fields in which they were originally proposed, techniques for these two models have been developed somewhat in isolation from each other. This tutorial paper reviews relationships between Bayesian Gaussian proc...
Subspace Regression in Reproducing Kernel Hilbert Space
We focus on three methods for finding a suitable subspace for regression in a reproducing kernel Hilbert space: kernel principal component analysis, kernel partial least squares, and kernel canonical correlation analysis, and we demonstrate how this fits within a more general context of subspace regression. For the kernel partial least squares case, a least squares support vector machine style der...